



A Proof of Proposition 1

Proof: First, it is straightforward to show that the IPW estimator of the ground truth treatment effect $\hat{\delta}$
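As a concrete illustration of the estimator being analyzed, the standard inverse-propensity-weighted (IPW) estimate of the average treatment effect can be sketched as follows. This is a generic sketch, not the paper's own code, and it assumes the propensity scores $e(x) = P(T{=}1 \mid X{=}x)$ are given:

```python
import numpy as np

def ipw_ate(y, t, e):
    """Inverse-propensity-weighted estimate of the average treatment effect.

    y : observed outcomes
    t : binary treatment indicators in {0, 1}
    e : propensity scores P(T=1 | X) for each unit
    """
    y = np.asarray(y, dtype=float)
    t = np.asarray(t, dtype=float)
    e = np.asarray(e, dtype=float)
    # Reweight treated and control outcomes by their inverse propensities,
    # then average the per-unit contrasts.
    return np.mean(t * y / e - (1 - t) * y / (1 - e))
```

With correctly specified propensity scores, this estimator is unbiased for $E[Y(1)] - E[Y(0)]$, which is the starting point of the variance computations that follow.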

Neural Information Processing Systems

We proceed to compute the variances of each estimator. The proof also extends trivially to the non-zero-mean case.

Causal model details for Section 5.2

In Section 5.2, we include a wide range of machine-learning-based causal inference methods to evaluate the performance of the causal error estimators. All other configurations are kept at their defaults.





Appendix


In practice, building f and g requires computing $w_{t,i} w_{t,j}$ for all $i, j$.

B.2 Classification

For the classification task with the logistic regression model, we modify the formula of logistic regression in the teaching objectives to make the derivation convenient. It also indicates that, with probability at least $p_1$, the LST teacher can achieve exponential teachability in iteration t. To achieve exponential teachability in T iterations, the sufficient condition in Eq. (22) must be satisfied in all T iterations. Then, we use a pre-trained DenseNet [65], as in [53], to generate 1024-dimensional features and a confidence score for each image.
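For reference, the unmodified logistic-regression student update that the teaching objective builds on can be sketched as below. This is the standard gradient step, not the paper's modified formula; the learning rate `lr` and the single-example setting are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    """Logistic function."""
    return 1.0 / (1.0 + np.exp(-z))

def student_step(w, x, y, lr=0.1):
    """One gradient step of logistic regression on a single teaching example.

    w : current student weight vector w_t
    x : feature vector of the example chosen by the teacher
    y : label in {0, 1}
    Returns the updated weights w_{t+1}.
    """
    # Gradient of the logistic loss w.r.t. w for one example.
    grad = (sigmoid(w @ x) - y) * x
    return w - lr * grad
```

In the teaching setting, the teacher selects (x, y) at each iteration so that this update contracts the distance to the target weights, which is what the exponential-teachability condition in Eq. (22) formalizes.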